On the Courtade-Kumar conjecture for certain classes of Boolean functions

Author

  • Septimia Sarbu
Abstract

We prove the Courtade-Kumar conjecture for certain classes of n-dimensional Boolean functions, for all n ≥ 2 and for all values of the error probability of the binary symmetric channel, 0 ≤ p ≤ 1/2. Let X = [X1 . . . Xn] be a vector of independent and identically distributed Bernoulli(1/2) random variables, which are the input to a memoryless binary symmetric channel with error probability 0 ≤ p ≤ 1/2, and let Y = [Y1 . . . Yn] be the corresponding output. Let f : {0, 1}^n → {0, 1} be an n-dimensional Boolean function. Then the Courtade-Kumar conjecture states that the mutual information satisfies MI(f(X), Y) ≤ 1 − H(p), where H(p) is the binary entropy function.
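For small n, the conjectured bound can be checked numerically by exact enumeration over {0,1}^n. The following sketch (function names are illustrative, not from the paper) computes I(f(X); Y) for the setup above and confirms that the dictator function f(x) = x1 meets 1 − H(p) with equality, since then I(f(X); Y) = I(X1; Y1), the capacity term of a single BSC use.

```python
import itertools
import math

def H2(p):
    """Binary entropy function H(p) in bits; H2(0) = H2(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(f, n, p):
    """Exact I(f(X); Y) in bits, with X uniform on {0,1}^n and
    Y the output of a memoryless BSC with crossover probability p."""
    joint = {}  # (f(x), y) -> probability
    for x in itertools.product((0, 1), repeat=n):
        b = f(x)
        px = 2.0 ** (-n)
        for y in itertools.product((0, 1), repeat=n):
            d = sum(xi != yi for xi, yi in zip(x, y))  # Hamming distance
            joint[(b, y)] = joint.get((b, y), 0.0) + px * p**d * (1 - p)**(n - d)
    # marginals of f(X) and Y
    pb, py = {}, {}
    for (b, y), q in joint.items():
        pb[b] = pb.get(b, 0.0) + q
        py[y] = py.get(y, 0.0) + q
    return sum(q * math.log2(q / (pb[b] * py[y]))
               for (b, y), q in joint.items() if q > 0)

# Dictator function f(x) = x1 attains the bound 1 - H(p) with equality:
n, p = 3, 0.1
print(mutual_information(lambda x: x[0], n, p))  # equals 1 - H2(0.1)
```

The enumeration is exponential in n, so this only serves as a sanity check for small dimensions, not as a proof technique.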


Similar resources

The "Most informative boolean function" conjecture holds for high noise

We prove the "Most informative boolean function" conjecture of Courtade and Kumar for high noise ε ≥ 1/2 − δ, for some absolute constant δ > 0. Namely, if X is uniformly distributed in {0, 1}^n and Y is obtained by flipping each coordinate of X independently with probability ε, then, provided ε ≥ 1/2 − δ, every boolean function f satisfies I(f(X); Y) ≤ 1 − H(ε). This conjecture was previously known...


The maximum mutual information between the output of a discrete symmetric channel and several classes of Boolean functions of its input

We prove the Courtade-Kumar conjecture, for several classes of n-dimensional Boolean functions, for all n ≥ 2 and for all values of the error probability of the binary symmetric channel, 0 ≤ p ≤ 1/2. This conjecture states that the mutual information between any Boolean function of an n-dimensional vector of independent and identically distributed inputs to a memoryless binary symmetric channe...


Remarks on the Most Informative Function Conjecture at fixed mean

In 2013, Courtade and Kumar posed the following problem: Let x ∼ {±1}^n be uniformly random, and form y ∼ {±1}^n by negating each bit of x independently with probability α. Is it true that the mutual information I(f(x); y) is maximized among f : {±1}^n → {±1} by f(x) = x1? We do not resolve this problem. Instead, we resolve the analogous problem in the settings of Gaussian space and the sphere. O...


Dictatorship is the Most Informative Balanced Function at the Extremes

Suppose X is a uniformly distributed n-dimensional binary vector and Y is obtained by passing X through a binary symmetric channel with crossover probability α. A recent conjecture by Courtade and Kumar postulates that I(f(X); Y) ≤ 1 − h(α) for any Boolean function f. In this paper, we prove the conjecture for all α ∈ [0, α_n], and under the restriction to balanced functions, also for all α ∈ [1...


Boolean functions: noise stability, non-interactive correlation, and mutual information

Let ε ∈ [0, 1/2] be the noise parameter and p > 1. We study the isoperimetric problem of determining, for fixed mean Ef, which Boolean function f maximizes the p-th moment E(Tε f)^p of the noise operator Tε acting on Boolean functions f : {0, 1}^n → {0, 1}. Our findings are: in the low noise scenario, i.e., ε is small, the maximum is achieved by the lexicographical function; in the high noise scenario, i.e...
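The noise operator referenced above acts pointwise as (Tε f)(x) = E[f(Y) | X = x], where Y flips each bit of x independently with probability ε. For small n, it can be evaluated by direct enumeration; the helper names below are illustrative, not from the paper.

```python
import itertools

def noise_operator(f, n, eps):
    """Tabulate (T_eps f)(x) = E[f(Y) | X = x] for all x in {0,1}^n,
    where Y flips each bit of x independently with probability eps."""
    table = {}
    for x in itertools.product((0, 1), repeat=n):
        val = 0.0
        for y in itertools.product((0, 1), repeat=n):
            d = sum(xi != yi for xi, yi in zip(x, y))  # number of flipped bits
            val += eps**d * (1 - eps)**(n - d) * f(y)
        table[x] = val
    return table

def pth_moment(table, p):
    """E[(T_eps f)(X)^p] under the uniform distribution on {0,1}^n."""
    return sum(v**p for v in table.values()) / len(table)

# For the dictator f(x) = x1, (T_eps f)(x) is 1 - eps or eps,
# depending only on the first coordinate of x:
t = noise_operator(lambda x: x[0], 2, 0.2)
print(pth_moment(t, 2))
```

This only evaluates the objective of the isoperimetric problem; it makes no claim about which function maximizes it.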



Journal:
  • CoRR

Volume: abs/1702.03953

Pages: -

Publication date: 2017